Routine Data Quality Assessment (RDQA)

                                           Checklist to Assess Program/Project Data Quality


                                           Number of Regional Aggregation Sites                                      1

                                           Number of District Aggregation Sites                                      1

                                           Number of Service Delivery Sites                                          1




                                                                   Version: Jan 2010



Important notes for the use of this spreadsheet:


1. To use the Routine Data Quality Assessment tool, you will need to ensure that your 'macro security' is set to a level below 'high'. With the spreadsheet open, go
to the 'Tools' pull-down menu and select 'Macro', then 'Security'. Select 'medium'. Close Excel and re-open the file. The next time you open the file, you will have to select
'Enable Macros' for the application to work as designed.

2. On the START Page (this page), please select the number of intermediate aggregation sites (IAS) and Service Delivery Points (SDPs) that you plan to review from the drop-down
lists above. IAS are typically district-level health units of the Ministry of Health.




B – INSTRUCTIONS FOR USE OF THE RDQA
1. Determine Purpose

The RDQA checklist can be used for:

- Initial assessment of M&E systems established by new implementing partners (or in decentralized systems) to collect, manage and report data.

- Routine supervision of data management and reporting systems and data quality at various levels. For example, routine supervision visits may include checking a certain period's
worth of data (e.g. one day, one week or one month) at the service site level, whereas periodic assessments (e.g. quarterly, biannually or annually) could be carried out at all levels to assess
the functioning of the entire program/project's M&E system.

- Periodic assessment by donors of the quality of data being provided to them (this use of the RDQA could be more frequent and more streamlined than official data quality audits that use the
DQA for Auditing, but less frequent than routine monitoring of data).

- Preparation for a formal data quality audit.

The RDQA is flexible enough for all of these uses. Countries and programs are encouraged to adapt the checklist to fit the local program context.




2. Level/Site Selection
Select the levels and sites to be included (depending on the purpose and the resources available). Once the purpose has been determined, the second step in the RDQA is to decide which levels of
the data-collection and reporting system will be included in the assessment - service sites, intermediate aggregation levels, and/or the central M&E unit. The levels should be determined once
the appropriate reporting levels have been identified and "mapped" (e.g., 100 sites provide the services in 10 districts; reports from sites are sent to the districts, which then send
aggregated reports to the M&E Unit). In some cases, the data flow will include more than one intermediate level (e.g. regions, provinces or states, or multiple levels of program organizations).




3. Identify indicators, data sources and reporting period.
The RDQA is designed to assess the quality of data and underlying systems related to indicators that are reported to programs or donors. It is necessary to select one or more indicators - or at
least program areas - to serve as the subject of the RDQA. This choice will be based on the list of reported indicators. For example, a program focusing on treatment for HIV may report
indicators of the number of people on ART. Another program may focus on meeting the needs of orphans and vulnerable children, so its indicators would come from the OVC
program area. A malaria program might focus on providing insecticide-treated bed nets (ITNs), on treating people for malaria, or on both of those activities.




4. Conduct site visits. During the site visits, the relevant sections of the appropriate checklists in the Excel file are filled out (e.g. the service site checklist at service sites). These
checklists are completed following interviews with relevant staff and reviews of site documentation. Using the drop-down lists on the START page of this workbook, select the appropriate
number of Intermediate Aggregation Levels (IAL) and Service Delivery Points (SDP) to be reviewed. The appropriate number of worksheets will automatically appear in the RDQA workbook
(up to 12 SDPs and 4 IALs).

5. Review outputs and findings. The RDQA outputs need to be reviewed for each site visited. Site-specific summary findings, in the form of recommendations, are noted at each site visited.




The RDQA checklists exist in MS Excel format and responses can be entered directly into the spreadsheets on the computer. Alternatively, the checklists can be printed and completed by
hand. When completed electronically, a dashboard produces graphs of summary statistics for each site and level of the reporting system.
The dashboard displays two graphs for each site visited:

- A spider graph displays qualitative data generated from the assessment of the data-collection and reporting system and can be used to prioritize areas for improvement.
- A bar chart shows the quantitative data generated from the data verifications; these can be used to plan for data quality improvement.


In addition, a 'Global Dashboard' shows statistics aggregated across and within levels to highlight overall strengths and weaknesses in the reporting system. The Global Dashboard shows a
spider graph for qualitative assessments and a bar chart for quantitative assessments, as above. In addition, strengths and weaknesses of the reporting system are displayed as dimensions of
data quality in a 100% stacked bar chart. For this analysis, questions are grouped by the applicable dimension of data quality (e.g. accuracy or reliability) and the number of responses of each
type (e.g. 'Yes - completely', 'Partly', etc.) is plotted as a percentage of all responses. A table of survey questions and their associated dimensions of data quality can be found on the
'Dimensions of data quality' tab in this workbook.
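
For illustration, the sketch below computes the percentages behind such a 100% stacked bar chart from a list of (dimension, response) pairs. Only the response labels come from this workbook; the dimension names, sample data and function name are hypothetical.

    from collections import Counter

    RESPONSE_TYPES = ["Yes - completely", "Partly", "No - not at all", "N/A"]

    def dimension_composition(responses):
        """Group (dimension, response) pairs and return, per dimension,
        each response type's share of all responses in that dimension."""
        by_dimension = {}
        for dimension, response in responses:
            by_dimension.setdefault(dimension, Counter())[response] += 1
        return {
            dimension: {r: 100.0 * counts[r] / sum(counts.values()) for r in RESPONSE_TYPES}
            for dimension, counts in by_dimension.items()
        }

    # Hypothetical sample: three questions across two dimensions.
    sample = [
        ("Accuracy", "Yes - completely"),
        ("Accuracy", "Partly"),
        ("Reliability", "No - not at all"),
    ]
    print(dimension_composition(sample))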



6. Develop a systems strengthening plan, including follow-up actions. The final output of the RDQA is an action plan for improving data quality, which describes the identified
strengthening measures, the staff responsible, the timeline for completion, the resources required and the follow-up. Using the graphs and the detailed comments for each question, weak-performing
functional areas of the reporting system can be identified. Program staff can then outline strengthening measures (e.g. training, data reviews), assign responsibilities and timelines,
and identify resources using the Action Plan tab in this workbook.




C – BACKGROUND INFORMATION – RDQA


Country:



Name of Program/project:



Indicator Reviewed:



Reporting Period Verified:



Assessment Team:                        Name                                                 Title                          Email

                Primary contact:




                                             M&E Management Unit at Central Level

                      Name of Site   Facility Code                                                                              Date (mm/dd/yy)

1

                                                Regional Level Aggregation Sites

                      Name of Site   Facility Code                                                   Region   Region Code       Date (mm/dd/yy)

1

                                                 District Level Aggregation Sites

                      Name of Site   Facility Code          District         District Code           Region   Region Code       Date (mm/dd/yy)

1

                                                 Service Delivery Points (SDPs)

                      Name of Site   Facility Code          District         District Code           Region   Region Code       Date (mm/dd/yy)

1




Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

Component of the M&E System
Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

Review availability and completeness of all indicator source documents for the selected reporting period.

1. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? If yes, determine how this might have affected reported numbers.

2. Are all available source documents complete? If no, determine how this might have affected reported numbers.

3. Review the dates on the source documents. Do all dates fall within the reporting period? If no, determine how this might have affected reported numbers.


B - Recounting Reported Results:

Recount results from source documents, compare the verified numbers to the site-reported numbers and explain discrepancies (if any).

4. Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A]

5. Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B]

6. Calculate the ratio of recounted to reported numbers. [A/B]

7. What are the reasons for any discrepancy observed (e.g., data entry errors, arithmetic errors, missing source documents, other)?
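
A minimal sketch of the item 6 calculation: the verification factor is the recounted total [A] divided by the reported total [B]. The function name and sample counts below are illustrative, not part of the tool.

    def verification_factor(recounted, reported):
        """Ratio of recounted to reported results [A/B].
        1.0 = exact match; > 1.0 suggests under-reporting; < 1.0 suggests over-reporting."""
        if reported == 0:
            raise ValueError("Reported total [B] must be non-zero.")
        return recounted / reported

    # Hypothetical example: 95 results recounted from source documents,
    # 100 reported on the site summary report.
    print(f"{verification_factor(95, 100):.0%}")  # 95% -> possible over-reporting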

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test kits or ITNs purchased and delivered during the reporting
period, to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying that these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from the Register to Patient Treatment Cards).

8. List the documents used for performing the cross-checks.

9. Describe the cross-checks performed.

10. What are the reasons for any discrepancy observed?
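
The two-direction cross-check described above can be summarized as a match rate in each direction. A small sketch, assuming records carry identifiers that can be compared across sources; the identifiers, sample data and function name are hypothetical.

    def concordance(source_ids, target_ids):
        """Share of records from one source that can be found in the other."""
        source_ids, target_ids = set(source_ids), set(target_ids)
        if not source_ids:
            return None
        return len(source_ids & target_ids) / len(source_ids)

    # Hypothetical sample: 5 patient cards checked against a register.
    cards = ["PT-001", "PT-002", "PT-003", "PT-004", "PT-005"]
    register = ["PT-001", "PT-002", "PT-004", "PT-005", "PT-009"]

    print(f"Cards found in register: {concordance(cards, register):.0%}")  # 80%
    print(f"Register found in cards: {concordance(register, cards):.0%}")  # 80%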




Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities

1. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).

2. The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.

3. All relevant staff have received training on the data management processes and tools.

II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …

4. … what they are supposed to report on.

5. … how (e.g., in what specific format) reports are to be submitted.

6. … to whom the reports should be submitted.

7. … when the reports are due.

III - Data-collection and Reporting Forms and Tools

8. Clear instructions have been provided by the M&E Unit on how to complete the data-collection and reporting forms/tools.

9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.

10. The standard forms/tools are consistently used by the Service Delivery Site.

11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).

12. The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).

IV - Data Management Processes

13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).

14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.

15. … if yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).

16. Relevant personal data are maintained according to national or international confidentiality guidelines.

17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).

18. The reporting system enables the identification and recording of a "drop-out", a person "lost to follow-up" and a person who died.

V - Links with National Reporting System

19. When available, the relevant national forms/tools are used for data collection and reporting.

20. When applicable, data are reported through a single channel of the national information system.

21. The system records information about where the service is delivered (i.e. region, district, ward, etc.).

22. … if yes, place names are recorded using standardized naming conventions.




Part 3: Recommendations for the Service Site

Based on the findings of the systems review and data verification at the service site, please describe any identified challenges to data quality and the recommended strengthening
measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

Identified Weaknesses                    Description of Action Point                    Responsible(s)                    Timeline


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



[Two charts: "Data Management Assessment - Service Delivery Point", a spider graph of the five system components (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System) on a 0.00-3.00 scale, and "Data and Reporting Verifications - Service Delivery Point", a bar chart of the Verification Factor on a 0%-1200% axis.]
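
The spider graph scores each systems-assessment section on a 0.00-3.00 scale. The workbook does not spell out its scoring rule in this section, so the sketch below assumes 'Yes - completely' = 3, 'Partly' = 2, 'No - not at all' = 1, with N/A responses excluded; treat these weights as an assumption rather than the tool's definitive method.

    # Assumed mapping of answer codes to scores; N/A responses are excluded.
    SCORES = {"Yes - completely": 3, "Partly": 2, "No - not at all": 1}

    def section_score(answers):
        """Average score for one section of the systems assessment (0-3 scale)."""
        scored = [SCORES[a] for a in answers if a in SCORES]
        return sum(scored) / len(scored) if scored else None

    # Hypothetical answers for section I (questions 1-3).
    print(section_score(["Yes - completely", "Partly", "N/A"]))  # 2.5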




Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes:
                                                                                      Yes - completely                               REVIEWER COMMENTS
                  Component of the M&E System                                              Partly        (Please provide detail for each response not coded "Yes - Completely". Detailed
                                                                                     No - not at all                   responses will help guide strengthening measures. )
                                                                                             N/A




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed?


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 2                                                                                Page 7
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       ,,, what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels


10     ….The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies desegregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         ….if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (i.e.
21
       region, district, ward, etc.)


22       ….if yes, place names are recorded using standarized naming conventions.




                                                                                            Service Point 2   Page 8
Part 3: Recommendations for the Service Site

       Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                          Responsible(s)            Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  Data Management Assessment - Service Delivery Point                                                          Data and Reporting Verifications -
                                                                                                                                     Service Delivery Point
                                                                                                         1200%
                                   I - M&E Structure,
                                   Functions and Capabilities


                                                 3.00                                                    1000%



                                                 2.00
    II- Indicator                                                                                        800%
                                                                             V - Links with
    Definitions and
                                                                             National
    Reporting                                    1.00
                                                                             Reporting System
    Guidelines
                                                                                                         600%
                                                 0.00



                                                                                                         400%




                                                                                                         200%
        III - Data-collection                                     IV- Data Management
        and Reporting Forms                                       Processes
        and Tools
                                                                                                           0%
                                                                                                                 Verification Factor




                                                                                       Service Point 2                                                                                   Page 9
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes:
                                                                                      Yes - completely                               REVIEWER COMMENTS
                  Component of the M&E System                                              Partly        (Please provide detail for each response not coded "Yes - Completely". Detailed
                                                                                     No - not at all                   responses will help guide strengthening measures. )
                                                                                             N/A




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed?


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 3                                                                               Page 10
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       ,,, what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels


10     ….The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies desegregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         ….if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (i.e.
21
       region, district, ward, etc.)


22       ….if yes, place names are recorded using standarized naming conventions.




                                                                                            Service Point 3   Page 11
Part 3: Recommendations for the Service Site

       Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                          Responsible(s)            Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  Data Management Assessment - Service Delivery Point                                                          Data and Reporting Verifications -
                                                                                                                                     Service Delivery Point
                                                                                                         1200%
                                   I - M&E Structure,
                                   Functions and Capabilities


                                                 3.00                                                    1000%



                                                 2.00
    II- Indicator                                                                                        800%
                                                                             V - Links with
    Definitions and
                                                                             National
    Reporting                                    1.00
                                                                             Reporting System
    Guidelines
                                                                                                         600%
                                                 0.00



                                                                                                         400%




                                                                                                         200%
        III - Data-collection                                     IV- Data Management
        and Reporting Forms                                       Processes
        and Tools
                                                                                                           0%
                                                                                                                 Verification Factor




                                                                                       Service Point 3                                                                               Page 12
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes:
                                                                                      Yes - completely                               REVIEWER COMMENTS
                  Component of the M&E System                                              Partly        (Please provide detail for each response not coded "Yes - Completely". Detailed
                                                                                     No - not at all                   responses will help guide strengthening measures. )
                                                                                             N/A




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed?


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 4                                                                               Page 13
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       ,,, what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels


10     ….The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies desegregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         ….if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (i.e.
21
       region, district, ward, etc.)


22       ….if yes, place names are recorded using standarized naming conventions.




                                                                                            Service Point 4   Page 14
Part 3: Recommendations for the Service Site

       Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                           Responsible(s)           Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



                  Data Management Assessment - Service Delivery Point                                                           Data and Reporting Verifications -
                                                                                                                                      Service Delivery Point
                                                                                                         1200%
                                   I - M&E Structure,
                                   Functions and Capabilities


                                                 3.00                                                    1000%



                                                 2.00
    II- Indicator                                                                                        800%
                                                                             V - Links with
    Definitions and
                                                                             National
    Reporting                                    1.00
                                                                             Reporting System
    Guidelines
                                                                                                         600%
                                                 0.00



                                                                                                         400%




                                                                                                         200%
        III - Data-collection                                     IV- Data Management
        and Reporting Forms                                       Processes
        and Tools
                                                                                                           0%
                                                                                                                 Verification Factor




                                                                                       Service Point 4                                                                               Page 15
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                                                                                      Answer Codes:
                                                                                      Yes - completely                               REVIEWER COMMENTS
                  Component of the M&E System                                              Partly        (Please provide detail for each response not coded "Yes - Completely". Detailed
                                                                                     No - not at all                   responses will help guide strengthening measures. )
                                                                                             N/A




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed?


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 5                                                                               Page 16
Part 2. Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II- Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       ,,, what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels


10     ….The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       case of computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies desegregation by these characteristics).

     IV- Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         ….if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (i.e.
21
       region, district, ward, etc.)


22       ….if yes, place names are recorded using standarized naming conventions.




                                                                                            Service Point 5   Page 17
Part 3: Recommendations for the Service Site

       Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
       measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                          Responsible(s)            Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



[Dashboard charts: a spider graph, "Data Management Assessment - Service Delivery Point", plots the five system components (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System) on a 0.00-3.00 scale; a bar chart, "Data and Reporting Verifications - Service Delivery Point", plots the Verification Factor on a percentage scale.]
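
The workbook draws these two graphs natively in Excel. Purely as an illustration of what the dashboard presents, a rough equivalent could be scripted as below; the component scores and verification factor are invented, and the matplotlib rendering is an assumption rather than part of the RDQA tool.

```python
# Illustrative re-creation of the two dashboard graphs (invented data).
import numpy as np
import matplotlib.pyplot as plt

labels = ["I - M&E Structure", "II - Definitions/Guidelines",
          "III - Forms and Tools", "IV - Data Management", "V - National Links"]
scores = [2.5, 1.8, 2.9, 1.2, 2.0]   # qualitative scores on the 0-3 scale
verification_factor = 0.95           # recounted/reported ratio [A/B]

angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()

fig = plt.figure(figsize=(10, 4))
ax1 = fig.add_subplot(1, 2, 1, projection="polar")   # spider graph
ax1.plot(angles + angles[:1], scores + scores[:1])
ax1.set_xticks(angles)
ax1.set_xticklabels(labels, fontsize=7)
ax1.set_ylim(0, 3)
ax1.set_title("Data Management Assessment", fontsize=9)

ax2 = fig.add_subplot(1, 2, 2)                       # verification bar chart
ax2.bar(["Verification Factor"], [verification_factor * 100])
ax2.set_ylabel("%")
ax2.set_title("Data and Reporting Verifications", fontsize=9)

plt.tight_layout()
plt.show()
```
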
                                                                                       Service Point 5                                                                               Page 18
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                   Component of the M&E System
                   Answer Codes: Yes - completely / Partly / No - not at all / N/A
                   REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -
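
As a worked example (invented numbers): if 95 events are recounted from the source documents [A] while the site summary report shows 100 [B], the verification factor is A/B = 95/100 = 95%, i.e. the site over-reported by about 5%. In code:

```python
recounted = 95    # [A] recounted from source documents (invented)
reported = 100    # [B] from the site summary report (invented)
print(f"Verification factor: {recounted / reported:.0%}")  # Verification factor: 95%
```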


     What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).
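
A minimal sketch of such a two-direction check, using invented patient identifiers:

```python
cards = {"P01", "P02", "P03"}      # sampled Patient Treatment Cards
register = {"P02", "P03", "P04"}   # matching entries found in the register

print("On a card but missing from the register:", cards - register)   # {'P01'}
print("In the register but with no sampled card:", register - cards)  # {'P04'}
```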


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 6                                                                               Page 19
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV - Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 6   Page 20
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, describe any data quality challenges identified and the recommended
       strengthening measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                          Responsible(s)            Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



[Dashboard charts: spider graph "Data Management Assessment - Service Delivery Point" (system components I-V, 0.00-3.00 scale) and bar chart "Data and Reporting Verifications - Service Delivery Point" (Verification Factor, percentage scale).]
                                                                                       Service Point 6                                                                               Page 21
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                   Component of the M&E System
                   Answer Codes: Yes - completely / Partly / No - not at all / N/A
                   REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 7                                                                               Page 22
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV - Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 7   Page 23
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, describe any data quality challenges identified and the recommended
       strengthening measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                           Responsible(s)           Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



[Dashboard charts: spider graph "Data Management Assessment - Service Delivery Point" (system components I-V, 0.00-3.00 scale) and bar chart "Data and Reporting Verifications - Service Delivery Point" (Verification Factor, percentage scale).]
                                                                                       Service Point 7                                                                               Page 24
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                   Component of the M&E System
                   Answer Codes: Yes - completely / Partly / No - not at all / N/A
                   REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 8                                                                               Page 25
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV - Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 8   Page 26
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, describe any data quality challenges identified and the recommended
       strengthening measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                           Responsible(s)           Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



[Dashboard charts: spider graph "Data Management Assessment - Service Delivery Point" (system components I-V, 0.00-3.00 scale) and bar chart "Data and Reporting Verifications - Service Delivery Point" (Verification Factor, percentage scale).]
                                                                                       Service Point 8                                                                               Page 27
Data Verification and System Assessment Sheet - Service Delivery Point

                                 Service Delivery Point/Organization:                                                                 -

                                                        Region and District:                                                          -

                                                        Indicator Reviewed:                                                           -

                                                              Date of Review:                                                         -

                                                Reporting Period Verified:                                                            -

                   Component of the M&E System
                   Answer Codes: Yes - completely / Partly / No - not at all / N/A
                   REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)




Part 1: Data Verifications

A - Documentation Review:

     Review availability and completeness of all indicator source documents for
     the selected reporting period.

     Review available source documents for the reporting period being verified. Is
     there any indication that source documents are missing?
1
     If yes, determine how this might have affected reported numbers.


     Are all available source documents complete?
2
     If no, determine how this might have affected reported numbers.


     Review the dates on the source documents. Do all dates fall within the
     reporting period?
3
     If no, determine how this might have affected reported numbers.


B - Recounting reported Results:

     Recount results from source documents, compare the verified numbers to the
     site reported numbers and explain discrepancies (if any).

     Recount the number of people, cases or events during the reporting period by
4
     reviewing the source documents. [A]

     Enter the number of people, cases or events reported by the site during the
5
     reporting period from the site summary report. [B]


6    Calculate the ratio of recounted to reported numbers. [A/B]                             -


     What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
     errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources:

Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).


8    List the documents used for performing the cross-checks.


9    Describe the cross-checks performed.


10   What are the reasons for the discrepancy (if any) observed?




                                                                                     Service Point 9                                                                               Page 28
Part 2: Systems Assessment

     I - M&E Structure, Functions and Capabilities

       There are designated staff responsible for reviewing aggregated numbers
1      prior to submission to the next level (e.g., to districts, to regional offices, to
       the central M&E Unit).

       The responsibility for recording the delivery of services on source documents
2
       is clearly assigned to the relevant staff.

       All relevant staff have received training on the data management processes
3
       and tools.

     II - Indicator Definitions and Reporting Guidelines

The M&E Unit has provided written guidelines to each sub-reporting level on …


4       … what they are supposed to report on.


5       … how (e.g., in what specific format) reports are to be submitted.


6       … to whom the reports should be submitted.


7       … when the reports are due.


     III - Data-collection and Reporting Forms and Tools

       Clear instructions have been provided by the M&E Unit on how to complete
8
       the data collection and reporting forms/tools.

       The M&E Unit has identified standard reporting forms/tools to be used by all
9
       reporting levels.


10     … The standard forms/tools are consistently used by the Service Delivery Site.

       All source documents and reporting forms relevant for measuring the
11     indicator(s) are available for auditing purposes (including dated print-outs in
       the case of a computerized system).

       The data collected on the source document has sufficient precision to
12     measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
       the indicator specifies disaggregation by these characteristics).

     IV - Data Management Processes

       If applicable, there are quality controls in place for when data from paper-
13     based forms are entered into a computer (e.g., double entry, post-data entry
       verification, etc.).

       If applicable, there is a written back-up procedure for when data entry or data
14
       processing is computerized.

         … if yes, the latest date of back-up is appropriate given the frequency of
15
         update of the computerized system (e.g., back-ups are weekly or monthly).

       Relevant personal data are maintained according to national or international
16
       confidentiality guidelines.

       The recording and reporting system avoids double counting people within and
       across Service Delivery Points (e.g., a person receiving the same service
17
       twice in a reporting period, a person registered as receiving the same service
       in two different locations, etc.).

       The reporting system enables the identification and recording of a "drop out",
18
       a person "lost to follow-up" and a person who died.

     V - Links with National Reporting System

       When available, the relevant national forms/tools are used for data-collection
19
       and reporting.

       When applicable, data are reported through a single channel of the national
20
       information systems.

       The system records information about where the service is delivered (e.g.,
21
       region, district, ward, etc.).


22       … if yes, place names are recorded using standardized naming conventions.




                                                                                            Service Point 9   Page 29
Part 3: Recommendations for the Service Site

       Based on the findings of the systems review and data verification at the service site, describe any data quality challenges identified and the recommended
       strengthening measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

       Identified Weaknesses                                                           Description of Action Point                           Responsible(s)           Time Line


1

2

3

4




Part 4: DASHBOARD: Service Delivery Point



[Dashboard charts: spider graph "Data Management Assessment - Service Delivery Point" (system components I-V, 0.00-3.00 scale) and bar chart "Data and Reporting Verifications - Service Delivery Point" (Verification Factor, percentage scale).]
                                                                                       Service Point 9                                                                               Page 30
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008

More Related Content

Similar to Routine data quality assessment tool june 2008

Requirement analysis
Requirement analysisRequirement analysis
Requirement analysiscsk selva
 
Chapter 11 Metrics for process and projects.ppt
Chapter 11  Metrics for process and projects.pptChapter 11  Metrics for process and projects.ppt
Chapter 11 Metrics for process and projects.pptssuser3f82c9
 
Ac2017 3. cast software-metricsincontracts
Ac2017   3. cast software-metricsincontractsAc2017   3. cast software-metricsincontracts
Ac2017 3. cast software-metricsincontractsNesma
 
Energy_Tracking Database_01_17_2014
Energy_Tracking Database_01_17_2014Energy_Tracking Database_01_17_2014
Energy_Tracking Database_01_17_2014Andrea Drabicki
 
Latest Innovations from Workday Analytics and Planning
Latest Innovations from Workday Analytics and PlanningLatest Innovations from Workday Analytics and Planning
Latest Innovations from Workday Analytics and PlanningWorkday, Inc.
 
CRASH Report 2014
CRASH Report 2014CRASH Report 2014
CRASH Report 2014CAST
 
Cards Performance Testing (Whitepaper)
Cards Performance Testing (Whitepaper)Cards Performance Testing (Whitepaper)
Cards Performance Testing (Whitepaper)Thinksoft Global
 
Project presentation
Project presentationProject presentation
Project presentationNiraj Bhujel
 
Increasing the probability of program success
Increasing the probability of program successIncreasing the probability of program success
Increasing the probability of program successGlen Alleman
 
Value Stream Mapping – Stories From the Trenches
Value Stream Mapping – Stories From the TrenchesValue Stream Mapping – Stories From the Trenches
Value Stream Mapping – Stories From the TrenchesDevOps.com
 
Snapshotz Assessments & Audits
Snapshotz Assessments & AuditsSnapshotz Assessments & Audits
Snapshotz Assessments & AuditsColin Taylor
 
Building New Institutional Capacity in M&E: The Experience of National AIDS C...
Building New Institutional Capacity in M&E: The Experience of National AIDS C...Building New Institutional Capacity in M&E: The Experience of National AIDS C...
Building New Institutional Capacity in M&E: The Experience of National AIDS C...MEASURE Evaluation
 
Middleware Soa Qualification Process Ver 2
Middleware Soa  Qualification Process Ver 2Middleware Soa  Qualification Process Ver 2
Middleware Soa Qualification Process Ver 2David Stephenson
 

Similar to Routine data quality assessment tool june 2008 (20)

Requirement analysis
Requirement analysisRequirement analysis
Requirement analysis
 
Chapter 11 Metrics for process and projects.ppt
Chapter 11  Metrics for process and projects.pptChapter 11  Metrics for process and projects.ppt
Chapter 11 Metrics for process and projects.ppt
 
Ac2017 3. cast software-metricsincontracts
Ac2017   3. cast software-metricsincontractsAc2017   3. cast software-metricsincontracts
Ac2017 3. cast software-metricsincontracts
 
Satish_Gudey
Satish_GudeySatish_Gudey
Satish_Gudey
 
Satish_Gudey
Satish_GudeySatish_Gudey
Satish_Gudey
 
Energy_Tracking Database_01_17_2014
Energy_Tracking Database_01_17_2014Energy_Tracking Database_01_17_2014
Energy_Tracking Database_01_17_2014
 
Software metrics
Software metricsSoftware metrics
Software metrics
 
CV_Sonali_Mane (1)
CV_Sonali_Mane (1)CV_Sonali_Mane (1)
CV_Sonali_Mane (1)
 
CV_Sonali_Mane
CV_Sonali_ManeCV_Sonali_Mane
CV_Sonali_Mane
 
Latest Innovations from Workday Analytics and Planning
Latest Innovations from Workday Analytics and PlanningLatest Innovations from Workday Analytics and Planning
Latest Innovations from Workday Analytics and Planning
 
CRASH Report 2014
CRASH Report 2014CRASH Report 2014
CRASH Report 2014
 
Performance Testing
Performance Testing Performance Testing
Performance Testing
 
Cards Performance Testing (Whitepaper)
Cards Performance Testing (Whitepaper)Cards Performance Testing (Whitepaper)
Cards Performance Testing (Whitepaper)
 
Project presentation
Project presentationProject presentation
Project presentation
 
Increasing the probability of program success
Increasing the probability of program successIncreasing the probability of program success
Increasing the probability of program success
 
Value Stream Mapping – Stories From the Trenches
Value Stream Mapping – Stories From the TrenchesValue Stream Mapping – Stories From the Trenches
Value Stream Mapping – Stories From the Trenches
 
Snapshotz Assessments & Audits
Snapshotz Assessments & AuditsSnapshotz Assessments & Audits
Snapshotz Assessments & Audits
 
Building New Institutional Capacity in M&E: The Experience of National AIDS C...
Building New Institutional Capacity in M&E: The Experience of National AIDS C...Building New Institutional Capacity in M&E: The Experience of National AIDS C...
Building New Institutional Capacity in M&E: The Experience of National AIDS C...
 
Middleware Soa Qualification Process Ver 2
Middleware Soa  Qualification Process Ver 2Middleware Soa  Qualification Process Ver 2
Middleware Soa Qualification Process Ver 2
 
Jagadeesh_Resume_5 + Years
Jagadeesh_Resume_5 + YearsJagadeesh_Resume_5 + Years
Jagadeesh_Resume_5 + Years
 

More from swiss1234

Sole solicitor qaq electronic version[sra]
Sole solicitor   qaq electronic version[sra]Sole solicitor   qaq electronic version[sra]
Sole solicitor qaq electronic version[sra]swiss1234
 
Bulgaria forum presentation
Bulgaria forum presentationBulgaria forum presentation
Bulgaria forum presentationswiss1234
 
Quality assurance 11x17
Quality assurance 11x17Quality assurance 11x17
Quality assurance 11x17swiss1234
 
New taught programme accreditation flowchart dec06
New taught programme accreditation flowchart dec06New taught programme accreditation flowchart dec06
New taught programme accreditation flowchart dec06swiss1234
 
Financial services intermediaries quality assurance and tcf questionnaire[fsa]
Financial services intermediaries   quality assurance and tcf questionnaire[fsa]Financial services intermediaries   quality assurance and tcf questionnaire[fsa]
Financial services intermediaries quality assurance and tcf questionnaire[fsa]swiss1234
 
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008swiss1234
 
Quality assurance 11x17
Quality assurance 11x17Quality assurance 11x17
Quality assurance 11x17swiss1234
 
Quality assurance
Quality assuranceQuality assurance
Quality assuranceswiss1234
 
New taught programme accreditation flowchart dec06
New taught programme accreditation flowchart dec06New taught programme accreditation flowchart dec06
New taught programme accreditation flowchart dec06swiss1234
 
Financial services intermediaries quality assurance and tcf questionnaire[fsa]
Financial services intermediaries   quality assurance and tcf questionnaire[fsa]Financial services intermediaries   quality assurance and tcf questionnaire[fsa]
Financial services intermediaries quality assurance and tcf questionnaire[fsa]swiss1234
 
Ec2 instace cost analysis
Ec2 instace cost analysisEc2 instace cost analysis
Ec2 instace cost analysisswiss1234
 
Pages bugs for deployment
Pages bugs for deploymentPages bugs for deployment
Pages bugs for deploymentswiss1234
 

More from swiss1234 (18)

Sole solicitor qaq electronic version[sra]
Sole solicitor   qaq electronic version[sra]Sole solicitor   qaq electronic version[sra]
Sole solicitor qaq electronic version[sra]
 
Bulgaria forum presentation
Bulgaria forum presentationBulgaria forum presentation
Bulgaria forum presentation
 
Sunimages
SunimagesSunimages
Sunimages
 
Satelite
SateliteSatelite
Satelite
 
Quality assurance 11x17
Quality assurance 11x17Quality assurance 11x17
Quality assurance 11x17
 
Pakistan
PakistanPakistan
Pakistan
 
New taught programme accreditation flowchart dec06
New taught programme accreditation flowchart dec06New taught programme accreditation flowchart dec06
New taught programme accreditation flowchart dec06
 
Financial services intermediaries quality assurance and tcf questionnaire[fsa]
Financial services intermediaries   quality assurance and tcf questionnaire[fsa]Financial services intermediaries   quality assurance and tcf questionnaire[fsa]
Financial services intermediaries quality assurance and tcf questionnaire[fsa]
 
Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008Routine data quality assessment tool june 2008
Routine data quality assessment tool june 2008
 
Quality assurance 11x17
Quality assurance 11x17Quality assurance 11x17
Quality assurance 11x17
 
Quality assurance
Quality assuranceQuality assurance
Quality assurance
 
Pakistan
PakistanPakistan
Pakistan
 
New taught programme accreditation flowchart dec06
New taught programme accreditation flowchart dec06New taught programme accreditation flowchart dec06
New taught programme accreditation flowchart dec06
 
Financial services intermediaries quality assurance and tcf questionnaire[fsa]
Financial services intermediaries   quality assurance and tcf questionnaire[fsa]Financial services intermediaries   quality assurance and tcf questionnaire[fsa]
Financial services intermediaries quality assurance and tcf questionnaire[fsa]
 
Whwg
WhwgWhwg
Whwg
 
Ec2 instace cost analysis
Ec2 instace cost analysisEc2 instace cost analysis
Ec2 instace cost analysis
 
Pages bugs for deployment
Pages bugs for deploymentPages bugs for deployment
Pages bugs for deployment
 
Cdr pan sat
Cdr pan satCdr pan sat
Cdr pan sat
 

Routine data quality assessment tool june 2008

  • 1. Routine Data Quality Assessment (RDQA) Checklist to Assess Program/Project Data Quality Number of Regional Aggregation Sites þÿ1 Number of District Aggregation Sites þÿ1 Number of Service Delivery Sites þÿ1 Version: Jan 2010 Important notes for the use of this spreadsheet: 1. In order to use the Routine Data Quality Assessment tool you will need to ensure that your 'macro security' is set to something less than 'high'. With the spreadsheet open, go to the 'Tools' pull-down menu and select 'Macro', then 'Security'. Select 'medium'. Close Excel and re-open the file. When you open the file the next time you will have to select 'Enable Macros' for the application to work as designed. 2. On the START Page (this page), please select number of intermediate aggregation sites (IAS) and Service Delivery Points (SDPs) that you plan to review from the dropdown lists above. IAS are typically the district level health unit of the Ministry of Health. START Page 1
  • 2. B – INSTRUCTIONS FOR USE OF THE RDQA (continued; steps 1 and 2, Determine Purpose and Level/Site Selection, appear on the INSTRUCTIONS page above)

3. Identify indicators, data sources and reporting period. The RDQA is designed to assess the quality of data and underlying systems related to indicators that are reported to programs or donors. It is necessary to select one or more indicators – or at least program areas – to serve as the subject of the RDQA. This choice will be based on the list of reported indicators. For example, a program focusing on treatment for HIV may report the number of people on ART. Another program may focus on meeting the needs of orphans and vulnerable children, in which case its indicators would come from the OVC program area. A malaria program might focus on providing insecticide-treated bed nets (ITNs), on treating people for malaria, or on both.

4. Conduct site visits. During the site visits, the relevant sections of the appropriate checklists in the Excel file are filled out (e.g., the service-site checklist at service sites). These checklists are completed following interviews with relevant staff and reviews of site documentation. Using the drop-down lists on the HEADER page of this workbook, select the appropriate number of Intermediate Aggregation Levels (IAL) and Service Delivery Points (SDP) to be reviewed; the corresponding worksheets will automatically appear in the RDQA workbook (up to 12 SDPs and 4 IALs).

5. Review outputs and findings. The RDQA outputs need to be reviewed for each site visited. Site-specific summary findings, in the form of recommendations, are noted at each site visited.
The RDQA checklists exist in MS Excel format, and responses can be entered directly into the spreadsheets on the computer. Alternatively, the checklists can be printed and completed by hand. When completed electronically, a dashboard produces graphics of summary statistics for each site and level of the reporting system. The dashboard displays two graphs for each site visited:

- A spider graph displays the qualitative data generated from the assessment of the data-collection and reporting system and can be used to prioritize areas for improvement.
- A bar chart shows the quantitative data generated from the data verifications; these can be used to plan for data quality improvement.

In addition, a 'Global Dashboard' shows statistics aggregated across and within levels to highlight overall strengths and weaknesses in the reporting system. The Global Dashboard shows a spider graph for qualitative assessments and a bar chart for quantitative assessments, as above. In addition, strengths and weaknesses of the reporting system are displayed as dimensions of data quality in a 100% stacked bar chart. For this analysis, questions are grouped by the applicable dimension of data quality (e.g., accuracy or reliability) and the number of responses of each type (e.g., 'Yes - completely', 'Partly', etc.) is plotted as a percentage of all responses; a minimal sketch of this aggregation appears below. A table of survey questions and their associated dimensions of data quality can be found on the 'Dimensions of data quality' tab in this workbook.

6. Develop a systems-strengthening plan, including follow-up actions. The final output of the RDQA is an action plan for improving data quality, which describes the identified strengthening measures, the staff responsible, the timeline for completion, the resources required and the follow-up. Using the graphics and the detailed comments for each question, weakly performing functional areas of the reporting system can be identified. Program staff can then outline strengthening measures (e.g., training, data reviews), assign responsibilities and timelines, and identify resources using the Action Plan tab in this workbook. INSTRUCTIONS Page 2
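The workbook performs this grouping in Excel. As a rough illustration outside the tool, here is a minimal Python sketch of the same aggregation; the question IDs, the QUESTION_DIMENSIONS mapping and the response_shares function are hypothetical stand-ins, not names from the RDQA workbook.

from collections import Counter, defaultdict

# Hypothetical mapping of survey questions to data-quality dimensions,
# mirroring the 'Dimensions of data quality' tab (question IDs invented here).
QUESTION_DIMENSIONS = {
    "SDP-1": "Reliability",
    "SDP-2": "Accuracy",
    "SDP-3": "Reliability",
    "SDP-12": "Precision",
}

RESPONSE_TYPES = ["Yes - completely", "Partly", "No - not at all", "N/A"]

def response_shares(responses):
    """Group responses by dimension and return each response type's share
    of all responses for that dimension -- the input to a 100% stacked bar."""
    counts = defaultdict(Counter)
    for question_id, answer in responses.items():
        dimension = QUESTION_DIMENSIONS.get(question_id)
        if dimension is not None:
            counts[dimension][answer] += 1
    shares = {}
    for dimension, counter in counts.items():
        total = sum(counter.values())
        shares[dimension] = {r: counter[r] / total for r in RESPONSE_TYPES}
    return shares

# Example: answers collected at one service delivery point.
print(response_shares({
    "SDP-1": "Yes - completely",
    "SDP-2": "Partly",
    "SDP-3": "Yes - completely",
    "SDP-12": "No - not at all",
}))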
  • 3. C – BACKGROUND INFORMATION – RDQA

Country: | Name of Program/project: | Indicator Reviewed: | Reporting Period Verified:

Assessment Team: Name | Title | Email (primary contact indicated)

M&E Management Unit at Central Level: Name of Site | Facility Code | Date (mm/dd/yy)

Regional Level Aggregation Sites: Name of Site | Facility Code | Region | Region Code | Date (mm/dd/yy)

District Level Aggregation Sites: Name of Site | Facility Code | District | District Code | Region | Region Code | Date (mm/dd/yy)

Service Delivery Points (SDPs): Name of Site | Facility Code | District | District Code | Region | Region Code | Date (mm/dd/yy)

Information_Page Page 3
  • 4. Data Verification and System Assessment Sheet - Service Delivery Point

Service Delivery Point/Organization: - | Region and District: - | Indicator Reviewed: - | Date of Review: - | Reporting Period Verified: -

Answer codes: Yes - completely / Partly / No - not at all / N/A. Reviewer comments: please provide detail for each response not coded "Yes - completely"; detailed responses will help guide strengthening measures.

Part 1: Data Verifications

A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period.
1. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? If yes, determine how this might have affected reported numbers.
2. Are all available source documents complete? If no, determine how this might have affected reported numbers.
3. Review the dates on the source documents. Do all dates fall within the reporting period? If no, determine how this might have affected reported numbers.

B - Recounting Reported Results: Recount results from source documents, compare the verified numbers to the site-reported numbers and explain any discrepancies.
4. Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A]
5. Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B]
6. Calculate the ratio of recounted to reported numbers. [A/B] (See the calculation sketch following this sheet.)
7. What are the reasons for any discrepancy observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?

C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test kits or ITNs purchased and delivered during the reporting period, to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying whether these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from the Register to Patient Treatment Cards).
8. List the documents used for performing the cross-checks.
9. Describe the cross-checks performed.
10. What are the reasons for any discrepancy observed?

Service Point 1 Page 4
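Item 6 above is the quantity plotted as the "Verification Factor" on the site dashboard. A minimal Python sketch of the calculation (the function name and the example figures are mine, not from the tool):

def verification_factor(recounted: int, reported: int) -> float | None:
    """Ratio of recounted to reported results [A/B].

    1.0 means the report matches the source documents; values above 1.0
    indicate under-reporting, values below 1.0 indicate over-reporting.
    """
    if reported == 0:
        return None  # ratio undefined when nothing was reported
    return recounted / reported

# Example: 95 events found in the registers vs. 100 reported upward.
factor = verification_factor(recounted=95, reported=100)
print(f"Verification factor: {factor:.2f}")  # 0.95 -> 5% over-reporting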
  • 5. Part 2: Systems Assessment

I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
2. The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
3. All relevant staff have received training on the data management processes and tools.

II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4. … what they are supposed to report on.
5. … how (e.g., in what specific format) reports are to be submitted.
6. … to whom the reports should be submitted.
7. … when the reports are due.

III - Data-collection and Reporting Forms and Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
10. … the standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).
12. The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).

IV - Data Management Processes
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. … if yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.

V - Links with National Reporting System
19. When available, the relevant national forms/tools are used for data collection and reporting.
20. When applicable, data are reported through a single channel of the national information system.
21. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
22. … if yes, place names are recorded using standardized naming conventions.

Service Point 1 Page 5
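The dashboard that follows plots each of the five functional areas above on a 0.00–3.00 spider-graph axis. A minimal Python sketch of how such per-section scores could be derived, assuming a scoring rule of 'Yes - completely' = 3, 'Partly' = 2, 'No - not at all' = 1 with 'N/A' excluded (the rule and the names below are assumptions, not quoted from the workbook):

# Assumed scoring convention; N/A responses carry no score.
SCORES = {"Yes - completely": 3, "Partly": 2, "No - not at all": 1}

def section_score(answers: list[str]) -> float | None:
    """Average numeric score for one functional area (one spider-graph axis)."""
    scored = [SCORES[a] for a in answers if a in SCORES]  # drops "N/A"
    return sum(scored) / len(scored) if scored else None

# Example: section II (questions 4-7) answered at one service delivery point.
print(section_score(["Yes - completely", "Partly", "Partly", "N/A"]))  # ~2.33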
  • 6. Part 3: Recommendations for the Service Site

Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and the recommended strengthening measures, with an estimate of how long each improvement measure could take. These will be discussed with the Program.

Identified Weaknesses | Description of Action Point | Responsible(s) | Time Line (rows 1-4)

Part 4: DASHBOARD: Service Delivery Point

[Two charts: "Data Management Assessment - Service Delivery Point", a spider graph scaled 0.00-3.00 with one axis per functional area (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System), and "Data and Reporting Verifications - Service Delivery Point", a bar chart of the Verification Factor scaled 0%-1200%.]

Service Point 1 Page 6
  • 7. Data Verification and System Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 2 Page 7
  • 8. Part 2. Systems Assessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. ….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 2 Page 8
  • 9. Part 3: Recommendations for the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 2 Page 9
  • 10. Data Verification and System Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 3 Page 10
  • 11. Part 2. Systems Assessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. ….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 3 Page 11
  • 12. Part 3: Recommendations for the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 3 Page 12
  • 13. Data Verification and System Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 4 Page 13
  • 14. Part 2. Systems Assessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. ….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 4 Page 14
  • 15. Part 3: Recommendations for the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 4 Page 15
  • 16. Data Verification and System Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 5 Page 16
  • 17. Part 2. Systems Assessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. ….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 5 Page 17
  • 18. Part 3: Recommendations for the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 5 Page 18
  • 19. Data Verification and System Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 6 Page 19
  • 20. Part 2. Systems Assessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. ….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 6 Page 20
  • 21. Part 3: Recommendations for the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 6 Page 21
  • 22. Data Verification and System Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 7 Page 22
  • 23. Part 2. Systems Assessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. ….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 7 Page 23
  • 24. Part 3: Recommendations for the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 7 Page 24
  • 25. Data Verification and System Assessment Sheet - Service Delivery Point Service Delivery Point/Organization: - Region and District: - Indicator Reviewed: - Date of Review: - Reporting Period Verified: - Answer Codes: Yes - completely REVIEWER COMMENTS Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed No - not at all responses will help guide strengthening measures. ) N/A Part 1: Data Verifications A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? 1 If yes, determine how this might have affected reported numbers. Are all available source documents complete? 2 If no, determine how this might have affected reported numbers. Review the dates on the source documents. Do all dates fall within the reporting period? 3 If no, determine how this might have affected reported numbers. B - Recounting reported Results: Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any). Recount the number of people, cases or events during the reporting period by 4 reviewing the source documents. [A] Enter the number of people, cases or events reported by the site during the 5 reporting period from the site summary report. [B] 6 Calculate the ratio of recounted to reported numbers. [A/B] - What are the reasons for the discrepancy (if any) observed (i.e., data entry 7 errors, arithmetic errors, missing source documents, other)? C - Cross-check reported results with other data sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards). 8 List the documents used for performing the cross-checks. 9 Describe the cross-checks performed? 10 What are the reasons for the discrepancy (if any) observed? Service Point 8 Page 25
  • 26. Part 2. Systems Assessment I - M&E Structure, Functions and Capabilities There are designated staff responsible for reviewing aggregated numbers 1 prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). The responsibility for recording the delivery of services on source documents 2 is clearly assigned to the relevant staff. All relevant staff have received training on the data management processes 3 and tools. II- Indicator Definitions and Reporting Guidelines The M&E Unit has provided written guidelines to each sub-reporting level on … 4 ,,, what they are supposed to report on. 5 … how (e.g., in what specific format) reports are to be submitted. 6 … to whom the reports should be submitted. 7 … when the reports are due. III - Data-collection and Reporting Forms and Tools Clear instructions have been provided by the M&E Unit on how to complete 8 the data collection and reporting forms/tools. The M&E Unit has identified standard reporting forms/tools to be used by all 9 reporting levels 10 ….The standard forms/tools are consistently used by the Service Delivery Site. All source documents and reporting forms relevant for measuring the 11 indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system). The data collected on the source document has sufficient precision to 12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies desegregation by these characteristics). IV- Data Management Processes If applicable, there are quality controls in place for when data from paper- 13 based forms are entered into a computer (e.g., double entry, post-data entry verification, etc). If applicable, there is a written back-up procedure for when data entry or data 14 processing is computerized. ….if yes, the latest date of back-up is appropriate given the frequency of 15 update of the computerized system (e.g., back-ups are weekly or monthly). Relevant personal data are maintained according to national or international 16 confidentiality guidelines. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service 17 twice in a reporting period, a person registered as receiving the same service in two different locations, etc). The reporting system enables the identification and recording of a "drop out", 18 a person "lost to follow-up" and a person who died. V - Links with National Reporting System When available, the relevant national forms/tools are used for data-collection 19 and reporting. When applicable, data are reported through a single channel of the national 20 information systems. The system records information about where the service is delivered (i.e. 21 region, district, ward, etc.) 22 ….if yes, place names are recorded using standarized naming conventions. Service Point 8 Page 26
  • 27. Part 3: Recommendations for the Service Site Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program. Identified Weaknesses Description of Action Point Responsible(s) Time Line 1 2 3 4 Part 4: DASHBOARD: Service Delivery Point Data Management Assessment - Service Delivery Point Data and Reporting Verifications - Service Delivery Point 1200% I - M&E Structure, Functions and Capabilities 3.00 1000% 2.00 II- Indicator 800% V - Links with Definitions and National Reporting 1.00 Reporting System Guidelines 600% 0.00 400% 200% III - Data-collection IV- Data Management and Reporting Forms Processes and Tools 0% Verification Factor Service Point 8 Page 27
• 28. Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -

Answer Codes: Yes - completely / Partly / No - not at all / N/A
Columns: Component of the M&E System | Answer | REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)

Part 1: Data Verifications

A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period.
1. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? If yes, determine how this might have affected reported numbers.
2. Are all available source documents complete? If no, determine how this might have affected reported numbers.
3. Review the dates on the source documents. Do all dates fall within the reporting period? If no, determine how this might have affected reported numbers.

B - Recounting Reported Results: Recount results from source documents, compare the verified numbers to the site-reported numbers and explain discrepancies (if any).
4. Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A]
5. Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B]
6. Calculate the ratio of recounted to reported numbers. [A/B] (See the sketch after this sheet.)
7. What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?

C - Cross-check Reported Results with Other Data Sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying whether these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from the Register to Patient Treatment Cards).
8. List the documents used for performing the cross-checks.
9. Describe the cross-checks performed.
10. What are the reasons for the discrepancy (if any) observed?

Service Point 9 Page 28
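The arithmetic in items 4-6 and the two-directional cross-check in section C are easy to illustrate. Below is a minimal sketch; the counts, card identifiers and register contents are invented examples, not data from any site.

    # Sketch of the recount ratio [A/B] (items 4-6) and a bidirectional
    # cross-check (section C). All numbers and identifiers are invented.
    def recount_ratio(recounted, reported):
        """[A/B]: recounted results divided by site-reported results.
        1.0 means exact agreement; <1.0 suggests over-reporting,
        >1.0 suggests under-reporting."""
        return recounted / reported

    # A = 118 people recounted from source documents; B = 120 on the summary report
    print(f"Verification ratio: {recount_ratio(118, 120):.2f}")  # 0.98

    # Cross-check in both directions: patient treatment cards vs. the register.
    cards = {"PT-001", "PT-002", "PT-003", "PT-004"}
    register = {"PT-001", "PT-002", "PT-004", "PT-005"}

    print("On a card but not in the register:", sorted(cards - register))  # ['PT-003']
    print("In the register but with no card:", sorted(register - cards))   # ['PT-005']

Running the comparison in both directions matters: cards missing from the register point to under-reporting, while register entries with no supporting card point to over-reporting or missing source documents.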
• 29. Part 2. Systems Assessment (Service Point 9) — the same 22-item checklist as Page 26, completed separately for Service Point 9. Service Point 9 Page 29
• 30. Part 3: Recommendations for the Service Site and Part 4: DASHBOARD (Service Point 9) — the same layout and charts as Page 27, completed separately for Service Point 9. Service Point 9 Page 30